
    Information and communication in polygon theories

    Generalized probabilistic theories (GPTs) provide a framework in which one can formulate physical theories that include classical and quantum theories, but also many other alternative theories. In order to compare different GPTs, we advocate an approach in which one views a state in a GPT as a resource, and quantifies the cost of interconverting between different such resources. We illustrate this approach on polygon theories (Janotta et al., New J. Phys. 13, 063024, 2011) that interpolate (as the number n of edges of the polygon increases) between a classical trit (when n=3) and a real quantum bit (when n=infinity). Our main results are that simulating the transmission of a single n-gon state requires more than one qubit, or more than log(log(n)) bits, and that n-gon states with n odd cannot be simulated by n'-gon states with n' even (for all n, n'). These results are obtained by showing that the classical capacity of a single n-gon state with n even is 1 bit, whereas it is larger than 1 bit when n is odd; by showing that transmitting a single n-gon state with n even violates information causality; and by studying the communication complexity cost of the nondeterministic not-equal function using n-gon states. Comment: 18 pages.
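
    A minimal numerical sketch of the polygon-theory construction referenced above, assuming the standard parameterization of Janotta et al. (pure states on a regular n-gon and the corresponding extremal effects). It only checks that the pairings e_i . omega_j yield valid probabilities; it does not reproduce the capacity results of the paper.

        import numpy as np

        def polygon_states_effects(n):
            """Pure states and extremal effects of the n-gon theory
            (parameterization following Janotta et al., NJP 13, 063024, 2011)."""
            r = 1.0 / np.sqrt(np.cos(np.pi / n))
            k = np.arange(n)
            # pure states omega_k = (r cos(2 pi k / n), r sin(2 pi k / n), 1)
            states = np.stack([r * np.cos(2 * np.pi * k / n),
                               r * np.sin(2 * np.pi * k / n),
                               np.ones(n)], axis=1)
            if n % 2 == 0:   # even polygons: effects sit between the states
                effects = 0.5 * np.stack([r * np.cos((2 * k - 1) * np.pi / n),
                                          r * np.sin((2 * k - 1) * np.pi / n),
                                          np.ones(n)], axis=1)
            else:            # odd polygons: effects are rescaled states
                effects = states / (1.0 + r ** 2)
            return states, effects

        for n in (3, 4, 5):
            states, effects = polygon_states_effects(n)
            probs = effects @ states.T          # probs[i, j] = e_i . omega_j
            assert np.all(probs > -1e-12) and np.all(probs < 1 + 1e-12)
            print(f"n={n}: outcome probabilities lie in "
                  f"[{probs.min():.3f}, {probs.max():.3f}]")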

    Uncertainty Relation for the Discrete Fourier Transform

    We derive an uncertainty relation for two unitary operators which obey a commutation relation of the form UV = exp[i phi] VU. Its most important application is to constrain how much a quantum state can be localised simultaneously in two mutually unbiased bases related by a Discrete Fourier Transform. It provides an uncertainty relation which smoothly interpolates between the well-known cases of the Pauli operators in 2 dimensions and the continuous variables position and momentum. This work also provides an uncertainty relation for modular variables, and could find applications in signal processing. In the finite-dimensional case the minimum-uncertainty states, discrete analogues of coherent and squeezed states, are minimum-energy solutions of Harper's equation, a discrete version of the harmonic oscillator equation. Comment: extended version; 13 pages; in press in Phys. Rev. Lett.
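
    A small numerical check of the type of commutation relation considered: for the d-dimensional clock and shift operators (a standard example, not necessarily the operators of the paper), UV = exp[i phi] VU holds with phi = 2 pi / d, and the two eigenbases are related by the discrete Fourier transform and hence mutually unbiased.

        import numpy as np

        d = 5                                   # any dimension; d = 2 gives the Pauli case
        omega = np.exp(2j * np.pi / d)

        Z = np.diag(omega ** np.arange(d))      # "clock" operator
        X = np.roll(np.eye(d), 1, axis=0)       # "shift" operator: X|k> = |k+1 mod d>
        F = omega ** np.outer(np.arange(d), np.arange(d)) / np.sqrt(d)   # DFT matrix

        # commutation relation of the form U V = exp[i phi] V U with phi = 2 pi / d
        assert np.allclose(Z @ X, omega * (X @ Z))

        # the eigenvectors of X are discrete Fourier modes, so the Z and X eigenbases
        # are mutually unbiased: every overlap has modulus squared 1/d
        assert np.allclose(np.abs(F) ** 2, np.ones((d, d)) / d)
        print("clock/shift operators satisfy ZX = exp(2 pi i / d) XZ for d =", d)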

    Hyperdense coding and superadditivity of classical capacities in hypersphere theories

    In quantum superdense coding, two parties previously sharing entanglement can communicate a two-bit message by sending a single qubit. We study this feature in the broader framework of general probabilistic theories. We consider a particular class of theories in which the local state space of the communicating parties corresponds to Euclidean hyperballs of dimension n (the case n = 3 corresponds to the Bloch ball of quantum theory). We show that a single n-ball can encode at most one bit of information, independently of n. We introduce a bipartite extension of such theories for which there exist dense coding protocols such that log_2(n+1) bits are communicated if entanglement is previously shared by the communicating parties. For n > 3, these protocols are more powerful than the quantum one, because more than two bits are communicated by transmission of a system that locally encodes at most one bit. We call this phenomenon hyperdense coding. Our hyperdense coding protocols imply superadditive classical capacities: two entangled systems can encode log_2(n+1) > 2 bits, even though each system individually encodes at most one bit. In our examples, hyperdense coding and superadditivity of classical capacities come at the expense of violating tomographic locality or dynamical continuous reversibility. Comment: expanded discussion in response to referee comments; accepted for publication in New Journal of Physics.
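
    For reference, a short simulation of ordinary quantum superdense coding, the two-bit benchmark that hyperdense coding is claimed to exceed. This sketch is standard textbook material and is independent of the hyperball theories introduced in the paper: Alice applies one of I, X, Z, ZX to her half of a Bell pair, sends that single qubit, and Bob's Bell-basis measurement recovers both bits with certainty.

        import numpy as np

        I = np.eye(2)
        X = np.array([[0, 1], [1, 0]])
        Z = np.array([[1, 0], [0, -1]])

        # shared Bell state |Phi+> = (|00> + |11>) / sqrt(2)
        phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

        # Alice encodes two classical bits by acting on her qubit only
        encodings = {(0, 0): I, (0, 1): X, (1, 0): Z, (1, 1): Z @ X}

        # Bob's Bell-basis measurement vectors
        bell_basis = {
            (0, 0): np.array([1, 0, 0, 1]) / np.sqrt(2),    # |Phi+>
            (0, 1): np.array([0, 1, 1, 0]) / np.sqrt(2),    # |Psi+>
            (1, 0): np.array([1, 0, 0, -1]) / np.sqrt(2),   # |Phi->
            (1, 1): np.array([0, 1, -1, 0]) / np.sqrt(2),   # |Psi->
        }

        for bits, U in encodings.items():
            sent = np.kron(U, I) @ phi_plus                 # one qubit is transmitted
            probs = {b: abs(v @ sent) ** 2 for b, v in bell_basis.items()}
            decoded = max(probs, key=probs.get)
            assert decoded == bits and abs(probs[decoded] - 1) < 1e-12
        print("2 bits recovered from 1 transmitted qubit for every encoding")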

    The Extent of Multi-particle Quantum Non-locality

    It is well known that entangled quantum states can be nonlocal: the correlations between local measurements carried out on these states cannot always be reproduced by local hidden-variable models. Svetlichny, followed by others, showed that multipartite quantum states are even more nonlocal than bipartite ones in the sense that nonlocal classical models with (superluminal) communication between some of the parties cannot reproduce the quantum correlations. Here we study in detail the kinds of nonlocality present in quantum states. More precisely, we ask which kinds of classical communication patterns cannot reproduce quantum correlations. By studying the extremal points of the space of all multiparty probability distributions, in which each party can make one of a pair of measurements with two possible outcomes, we find a necessary condition for classical nonlocal models to reproduce the statistics of all quantum states. This condition extends and generalises work of Svetlichny and others in which it was shown that a particular class of classical nonlocal models, the "separable" models, cannot reproduce the statistics of all multiparticle quantum states. Our condition shows that the nonlocality present in some entangled multiparticle quantum states is much stronger than previously thought. We also study the sufficiency of our condition. Comment: 10 pages, 2 figures; journal version.
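
    As a toy illustration of the polytope approach in the simplest (bipartite) case, the sketch below enumerates all local deterministic strategies, which are the extremal points of the local polytope, and recovers the CHSH bound of 2, compared with the quantum value 2*sqrt(2). The multiparty analysis of the paper generalizes this idea to classical models with communication between subsets of parties.

        import itertools
        import numpy as np

        # Two parties, settings x, y in {0, 1}, outcomes in {+1, -1}.
        # A local deterministic strategy fixes an outcome for each setting;
        # these strategies are the extremal points of the local polytope.
        best_local = -np.inf
        for a0, a1, b0, b1 in itertools.product([1, -1], repeat=4):
            E = lambda x, y: (a0 if x == 0 else a1) * (b0 if y == 0 else b1)
            chsh = E(0, 0) + E(0, 1) + E(1, 0) - E(1, 1)
            best_local = max(best_local, chsh)

        print("maximal CHSH value over local deterministic models:", best_local)  # 2
        print("quantum (Tsirelson) bound:", 2 * np.sqrt(2))                       # ~2.83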

    Device-Independent Bit Commitment based on the CHSH Inequality

    Bit commitment and coin flipping occupy a unique place in the device-independent landscape, as the only device-independent protocols thus far suggested for these tasks rely on tripartite GHZ correlations. Indeed, we know of no other bipartite tasks that admit a device-independent formulation but are not known to be implementable using only bipartite nonlocality. Another interesting feature of these protocols is that the pseudo-telepathic nature of GHZ correlations, in contrast to the generally statistical character of nonlocal correlations such as those arising in the violation of the CHSH inequality, is essential to their formulation and analysis. In this work, we present a device-independent bit commitment protocol based on CHSH testing which achieves the same security as the optimal GHZ-based protocol. The protocol is analyzed in the most general settings, where the devices are used repeatedly and may have long-term quantum memory. We also recast the protocol in a post-quantum setting where both honest and dishonest parties are restricted only by the impossibility of signaling, and find that overall the supra-quantum structure allows for greater security. Comment: 15 pages, 3 figures.
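
    To illustrate the contrast drawn above, a brief numerical check (standard material, not the protocol of the paper) that GHZ correlations are pseudo-telepathic: the Mermin-type observables XXX, XYY, YXY and YYX have expectation values of exactly +/-1 on the GHZ state, whereas CHSH violation is only statistical, reaching 2*sqrt(2) out of an algebraic maximum of 4.

        import numpy as np
        from functools import reduce

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        Y = np.array([[0, -1j], [1j, 0]])

        ghz = np.zeros(8, dtype=complex)
        ghz[0] = ghz[7] = 1 / np.sqrt(2)        # (|000> + |111>) / sqrt(2)

        def expectation(ops):
            """Expectation value of a three-qubit product observable on the GHZ state."""
            M = reduce(np.kron, ops)
            return np.vdot(ghz, M @ ghz).real

        # deterministic (+/-1) correlations: the hallmark of pseudo-telepathy
        print("XXX:", round(expectation([X, X, X]), 6))      # +1
        for ops in ([X, Y, Y], [Y, X, Y], [Y, Y, X]):
            print("XYY-type:", round(expectation(ops), 6))   # -1 each

        # by contrast, the CHSH value only reaches 2*sqrt(2), below its algebraic maximum 4
        print("CHSH quantum maximum:", round(2 * np.sqrt(2), 3), "< 4")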

    A Primer for Black Hole Quantum Physics

    The mechanisms which give rise to Hawking radiation are revealed by analyzing in detail pair production in the presence of horizons. In preparation for the black hole problem, three preparatory problems are treated at length: pair production in an external electric field, thermalization of a uniformly accelerated detector, and accelerated mirrors. In the light of these examples, the black hole evaporation problem is then presented. The leitmotif is the singular behavior of modes on the horizon which gives rise to a steady rate of production. Special emphasis is put on how each produced particle contributes to the mean, even though it arises from a particular vacuum fluctuation. It is the mean which drives the semiclassical back reaction. This aspect is analyzed in more detail than heretofore, and in particular its drawbacks are emphasized. It is the semiclassical theory which gives rise to Hawking's famous equation for the loss of mass of the black hole due to evaporation, dM/dt \simeq -1/M^2. Black hole thermodynamics is derived from the evaporation process, whereupon the reservoir character of the black hole becomes manifest. The relation to the thermodynamics of the eternal black hole through the Hartle-Hawking vacuum and the Killing identity is displayed. It is through the analysis of the fluctuations of the field configurations which give rise to a particular Hawking photon that the dubious character of the semiclassical theory becomes manifest. The present frontier of research revolves around this problem and is principally concerned with the fact that one calls upon energy scales greater than Planckian, as well as with the possibility of a non-unitary evolution. These last subjects are presented in qualitative fashion only, so that this review stops at the threshold of quantum gravity. Comment: an old review article on black hole evaporation and black hole thermodynamics, put on the archive following popular demand; 178 pages, 21 figures. (This text differs slightly from the published version.)
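
    For concreteness, the quoted mass-loss law can be integrated directly. In units where the proportionality constant is set to one (an assumption made only to keep the algebra transparent), the black hole evaporates in a finite time scaling as the cube of its initial mass:

        \frac{dM}{dt} \simeq -\frac{1}{M^{2}}
        \quad\Longrightarrow\quad
        M^{2}\, dM \simeq -\, dt
        \quad\Longrightarrow\quad
        M(t) \simeq \left( M_{0}^{3} - 3t \right)^{1/3},
        \qquad
        t_{\mathrm{evap}} \simeq \frac{M_{0}^{3}}{3}.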

    Optimality of the genetic code with respect to protein stability and amino acid frequencies

    How robust is the natural genetic code with respect to mistranslation errors? It has long been known that the genetic code is very efficient in limiting the effect of point mutations. A misread codon will commonly code either for the same amino acid or for a similar one in terms of its biochemical properties, so the structure and function of the coded protein remain relatively unaltered. Previous studies have attempted to address this question more quantitatively, namely by statistically estimating the fraction of randomly generated codes that do better than the natural code in terms of overall robustness. In this paper, we extend these results by investigating the role of amino acid frequencies in the optimality of the genetic code. When measuring the relative fitness of the natural code with respect to a random code, it is indeed natural to assume that, at equal mutation cost, a translation error affecting a frequent amino acid is less favorable than one affecting a rare amino acid. We find that taking the amino acid frequencies into account accordingly decreases the fraction of random codes that beat the natural code, making the latter comparatively even more robust. This effect is particularly pronounced when measures of the amino acid substitution cost more refined than hydrophobicity are used. To show this, we devise a new cost function by evaluating with computer experiments the change in folding free energy caused by all possible single-site mutations in a set of known protein structures. With this cost function, we estimate that on the order of one random code in 100 million is fitter than the natural code when amino acid frequencies are taken into account. The genetic code therefore seems structured so as to minimize the consequences of translation errors for the 3D structure and stability of proteins. Comment: 31 pages, 2 figures, PostScript file.
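
    A schematic sketch of the estimation procedure described above: random codes are generated by reassigning the twenty amino acids to the natural synonymous codon blocks, and each code is scored by a frequency-weighted average cost over all single-nucleotide misreadings. The amino-acid frequencies and the substitution-cost matrix below are random placeholders introduced only to make the sketch runnable; the paper's cost function is instead derived from computed folding free-energy changes.

        import random
        import numpy as np

        # Standard genetic code (DNA codons, first/second/third base each in order TCAG);
        # '*' marks stop codons.
        bases = "TCAG"
        aa_table = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
                    "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
        codons = [a + b + c for a in bases for b in bases for c in bases]
        CODE = dict(zip(codons, aa_table))
        AMINO_ACIDS = sorted(set(aa_table) - {"*"})       # the 20 amino acids

        # Placeholder inputs (assumptions for illustration only).
        rng = np.random.default_rng(0)
        random.seed(0)
        freq = dict(zip(AMINO_ACIDS, rng.dirichlet(np.ones(20))))
        cost_vals = rng.random((20, 20))
        cost_vals = (cost_vals + cost_vals.T) / 2
        np.fill_diagonal(cost_vals, 0.0)                  # synonymous substitutions cost nothing
        cost = {(a, b): cost_vals[i, j]
                for i, a in enumerate(AMINO_ACIDS) for j, b in enumerate(AMINO_ACIDS)}

        def code_cost(code):
            """Frequency-weighted average cost over all single-nucleotide misreadings."""
            total = weight = 0.0
            for codon, aa in code.items():
                if aa == "*":
                    continue
                for pos in range(3):
                    for b in bases:
                        if b == codon[pos]:
                            continue
                        mis = code[codon[:pos] + b + codon[pos + 1:]]
                        if mis == "*":
                            continue
                        total += freq[aa] * cost[(aa, mis)]
                        weight += freq[aa]
            return total / weight

        def random_code():
            """Reassign the 20 amino acids at random to the natural synonymous codon blocks."""
            shuffled = AMINO_ACIDS[:]
            random.shuffle(shuffled)
            relabel = dict(zip(AMINO_ACIDS, shuffled))
            return {c: ("*" if aa == "*" else relabel[aa]) for c, aa in CODE.items()}

        natural = code_cost(CODE)
        n_trials = 2000
        better = sum(code_cost(random_code()) < natural for _ in range(n_trials))
        print(f"fraction of random codes beating the natural code: {better / n_trials:.4f}")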

    Resource efficient single photon source based on active frequency multiplexing

    We propose a new single-photon source based on the principle of active multiplexing of heralded single photons which, unlike previously reported architectures, requires a limited amount of physical resources. We discuss both its feasibility and the purity and indistinguishability of the single photons as a function of the key parameters of a possible implementation.
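
    A toy model (not the specific architecture proposed in the paper) of why multiplexing heralded sources helps: with N modes each heralding a photon with probability p per clock cycle and an active switching network of transmission eta, the single-photon delivery probability grows roughly as eta * (1 - (1 - p)^N). The parameter values below are purely illustrative.

        import numpy as np

        def heralded_output_probability(p, N, eta_switch):
            """Toy model: probability that at least one of N multiplexed heralded
            modes fires in a clock cycle, times the switching transmission."""
            return eta_switch * (1 - (1 - p) ** N)

        p = 0.05     # heralding probability per mode per cycle (assumed)
        eta = 0.8    # transmission of the active switching network (assumed)
        for N in (1, 2, 4, 8, 16, 32):
            print(f"N = {N:2d} multiplexed modes -> single-photon probability "
                  f"~ {heralded_output_probability(p, N, eta):.3f}")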